Variance Estimation for the General Regression Estimator

Author

  • Richard Valliant
Abstract

A variety of estimators of the variance of the general regression (GREG) estimator of a mean have been proposed in the sampling literature, mainly with the goal of estimating the design-based variance. Estimators can be easily constructed that, under certain conditions, are approximately unbiased for both the design-variance and the model-variance. Several dual-purpose estimators are studied here in single-stage sampling. These choices are robust estimators of a model-variance even if the model that motivates the GREG has an incorrect variance parameter. A key feature of the robust estimators is the adjustment of squared residuals by factors analogous to the leverages used in standard regression analysis. We also show that the delete-one jackknife implicitly includes the leverage adjustments and is a good choice from either the design-based or model-based perspective. In a set of simulations, these variance estimators have small bias and produce confidence intervals with near-nominal coverage rates for several sampling methods, sample sizes, and populations in single-stage sampling. We also present simulation results for a skewed population where all variance estimators perform poorly. Samples that do not adequately represent the units with large values lead to estimated means that are too small, variance estimates that are too small, and confidence intervals that cover at far less than the nominal rate. These defects need to be avoided at the design stage by selecting samples that cover the extreme units well. However, in populations with inadequate design information this will not be feasible.
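To make the leverage adjustment concrete, here is a minimal sketch of a GREG mean and one leverage-adjusted variance estimate in single-stage sampling. It is not the paper's exact set of estimators: it assumes unit model variances, known auxiliary population totals, and a with-replacement approximation that ignores joint-inclusion probabilities; the function name greg_mean_and_var and its arguments are illustrative.

    import numpy as np

    def greg_mean_and_var(y, X, pi, x_totals, N):
        # y: (n,) responses; X: (n, p) auxiliaries; pi: (n,) inclusion probabilities
        # x_totals: (p,) known population totals of the auxiliaries; N: population size
        w = 1.0 / pi                                  # design weights
        WX = X * w[:, None]
        A_inv = np.linalg.inv(X.T @ WX)               # (X'WX)^{-1}
        beta = A_inv @ (WX.T @ y)                     # survey-weighted regression coefficients
        ht_x = X.T @ w                                # Horvitz-Thompson totals of x
        greg_mean = (w @ y + (x_totals - ht_x) @ beta) / N

        e = y - X @ beta                              # residuals from the assisting model
        h = np.einsum('ij,jk,ik->i', X, A_inv, WX)    # leverages h_ii = w_i x_i'(X'WX)^{-1}x_i
        g = 1.0 + X @ (A_inv @ (x_totals - ht_x))     # g-weights
        z = g * w * e / (1.0 - h)                     # leverage-adjusted weighted residuals
        n = len(y)
        var_hat = n / (n - 1) * np.sum((z - z.mean()) ** 2) / N ** 2
        return greg_mean, var_hat

Dividing each residual by 1 - h_ii is the leverage-type adjustment discussed above; as the abstract notes, the delete-one jackknife builds in a comparable adjustment automatically.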


Similar resources

Multivariate Local Polynomial Kernel Estimators: Leading Bias and Asymptotic Distribution∗

Masry (1996b) provides bias and variance expressions for a general local polynomial kernel estimator in a general multivariate regression framework. Under smoothness conditions on the unknown regression function, and by including more refined approximation terms than those in Masry (1996b), we extend the result of Masry (1996b) to obtain explicit leading bias terms for the whole vector of the loca...
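For orientation only, the following is the simplest univariate, degree-one special case of the estimator analyzed in the cited work (which treats the general multivariate local polynomial framework); the function name and the Gaussian kernel are illustrative choices, not taken from that paper.

    import numpy as np

    def local_linear_fit(x0, x, y, bandwidth):
        # Local linear kernel estimate of E[y | x = x0] with a Gaussian kernel.
        u = (x - x0) / bandwidth
        k = np.exp(-0.5 * u ** 2)                     # kernel weights
        Z = np.column_stack([np.ones_like(x), x - x0])
        coef = np.linalg.solve(Z.T @ (k[:, None] * Z), Z.T @ (k * y))
        return coef[0]                                # intercept = fitted value at x0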


An Empirical Comparison of Performance of the Unified Approach to Linearization of Variance Estimation after Imputation with Some Other Methods

Imputation is one of the most common methods for reducing item non-response effects. Imputation results in a complete data set, which makes it possible to use naïve estimators. After most common imputation methods are applied, the mean and total (imputation estimators) remain unbiased; however, their variances (imputation variances) are underestimated by naïve variance estimators. Sampling mechanism an...
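As a small illustration of the underestimation mentioned above (a hypothetical simulation, not taken from the cited paper): after mean imputation, a naïve variance estimator treats the imputed values as real observations and understates the variance of the estimated mean.

    import numpy as np

    rng = np.random.default_rng(1)                    # illustrative numbers only
    n = 500
    y = rng.normal(50.0, 10.0, n)
    observed = rng.random(n) < 0.7                    # about 70% response, missing completely at random
    y_imputed = y.copy()
    y_imputed[~observed] = y[observed].mean()         # mean imputation

    naive_var = y_imputed.var(ddof=1) / n             # treats imputed values as real data
    respondent_var = y[observed].var(ddof=1) / observed.sum()
    print(naive_var, respondent_var)                  # naive_var is noticeably smaller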


Change Point Estimation of a Process Variance with a Linear Trend Disturbance

When a change occurs in a process, one expects to receive a signal from a control chart as quickly as possible. Upon receipt of a signal from the control chart, a search for the source of the disturbance begins. However, because the disturbance may have manifested itself in the process some time earlier, searching for an assignable cause around the signal time may not always l...


Reducing Variance in Univariate Smoothing

A variance reduction technique in nonparametric smoothing is proposed: at each point of estimation, form a linear combination of a preliminary estimator evaluated at nearby points with the coefficients specified so that the asymptotic bias remains unchanged. The nearby points are chosen to maximize the variance reduction. We study in detail the case of univariate local linear regression. While ...


The Structure of Bhattacharyya Matrix in Natural Exponential Family and Its Role in Approximating the Variance of a Statistics

In most situations the best estimator of a function of the parameter exists, but it sometimes has a complex form whose variance cannot be computed explicitly. A lower bound for the variance of an estimator is therefore one of the fundamentals of estimation theory, because it gives an idea of the accuracy of the estimator. It is well known in statistical inference that the Cramér...
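For reference, the bounds in question (textbook material, not taken from the cited paper) are the Cramér-Rao inequality and its Bhattacharyya generalization for an unbiased estimator T(X) of g(θ), written in LaTeX:

    \operatorname{Var}_\theta(T) \ge \frac{\{g'(\theta)\}^2}{I(\theta)},
    \qquad I(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right],

    \operatorname{Var}_\theta(T) \ge \mathbf{h}^{\top} B^{-1} \mathbf{h},
    \qquad h_j = \frac{\partial^{j} g(\theta)}{\partial\theta^{j}},
    \quad B_{ij} = E_\theta\!\left[\frac{1}{f}\frac{\partial^{i} f}{\partial\theta^{i}}
      \cdot \frac{1}{f}\frac{\partial^{j} f}{\partial\theta^{j}}\right],

where B is the Bhattacharyya matrix; the first-order case reduces to the Cramér-Rao bound.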


A Berry-Esseen Type Bound for a Smoothed Version of Grenander Estimator

In various statistical models, such as density estimation and estimation of regression curves or hazard rates, monotonicity constraints can arise naturally. A frequently encountered problem in nonparametric statistics is to estimate a monotone density function f on a compact interval. A well-known estimator of the density f under the restriction that f is decreasing is the Grenander estimator, ...
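For context, here is a minimal sketch of the unsmoothed Grenander estimator itself, the left derivative of the least concave majorant (LCM) of the empirical CDF; the cited paper concerns a smoothed version and a Berry-Esseen type bound for it. The sketch assumes continuous data with no ties, and the names are illustrative.

    import numpy as np

    def grenander(sample):
        # Slopes of the least concave majorant of the empirical CDF give the
        # Grenander estimate of a decreasing density.  Assumes no tied observations.
        x = np.sort(np.asarray(sample, dtype=float))
        n = x.size
        xs = np.concatenate(([0.0], x))               # CDF knots, starting at the origin
        ys = np.arange(n + 1) / n
        hull = [0]                                    # indices of LCM vertices
        for i in range(1, n + 1):
            hull.append(i)
            while len(hull) >= 3:                     # keep slopes non-increasing (concavity)
                i0, i1, i2 = hull[-3], hull[-2], hull[-1]
                s_left = (ys[i1] - ys[i0]) / (xs[i1] - xs[i0])
                s_right = (ys[i2] - ys[i1]) / (xs[i2] - xs[i1])
                if s_left < s_right:                  # middle vertex lies below the chord
                    hull.pop(-2)
                else:
                    break
        knots = xs[hull]
        heights = np.diff(ys[hull]) / np.diff(xs[hull])   # density value on each interval
        return knots, heights

The estimated density equals heights[j] on the interval (knots[j], knots[j+1]].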




Publication date: 2002